    Ultrafast processing of pixel detector data with machine learning frameworks

    Modern photon science performed at high-repetition-rate free-electron laser (FEL) facilities and beyond relies on 2D pixel detectors operating at increasing frequencies (towards 100 kHz at LCLS-II) and producing rapidly increasing amounts of data (towards TB/s). These data must be rapidly stored for offline analysis and summarized in real time. While at LCLS all raw data have been stored, at LCLS-II this would incur a prohibitive cost; instead, real-time processing of pixel detector raw data reduces the size and cost of online processing, offline processing, and storage by orders of magnitude while preserving full photon information, by taking advantage of the compressibility of the sparse data typical of LCLS-II applications. We investigated whether recent developments in machine learning are useful in data processing for high-speed pixel detectors and found that typical deep learning models and autoencoder architectures failed to yield useful noise reduction while preserving full photon information, presumably because of the very different statistics and feature sets of computer vision and radiation imaging. However, we redesigned in TensorFlow mathematically equivalent versions of the state-of-the-art "classical" algorithms used at LCLS. The novel TensorFlow models resulted in elegant, compact and hardware-agnostic code, gaining 1 to 2 orders of magnitude in processing speed on an inexpensive consumer GPU and reducing the projected cost of online analysis at LCLS-II by 3 orders of magnitude. Computer vision a decade ago was dominated by hand-crafted filters; their structure inspired the deep learning revolution that resulted in modern deep convolutional networks; similarly, our novel TensorFlow filters provide inspiration for designing future deep learning architectures for ultrafast and efficient processing and classification of pixel detector images at FEL facilities. Comment: 9 pages, 9 figures
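    As an illustration of the approach described above, here is a minimal sketch (ours, not the authors' code) of a classical per-pixel correction written as pure TensorFlow tensor operations, so the same code runs unchanged on CPU or GPU; the pedestal, gain, and threshold values are invented for the example.

```python
# A minimal sketch of a classical detector correction expressed as
# TensorFlow ops. Pedestal/gain arrays and the 0.5-photon threshold
# are illustrative assumptions, not the paper's actual algorithm.
import tensorflow as tf

@tf.function
def correct_and_sparsify(raw, pedestal, gain, photon_adu=30.0):
    """Pedestal-subtract, gain-correct, and zero sub-threshold pixels."""
    photons = (tf.cast(raw, tf.float32) - pedestal) / gain / photon_adu
    # Keep only pixels above half a photon; sparse frames compress well.
    return tf.where(photons > 0.5, photons, tf.zeros_like(photons))

# Example: a batch of 8 frames of 512x512 raw ADU values.
raw = tf.random.uniform((8, 512, 512), 0, 4096, dtype=tf.int32)
pedestal = tf.fill((512, 512), 100.0)
gain = tf.ones((512, 512))
sparse = correct_and_sparsify(raw, pedestal, gain)
```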

    Not Just Fun and Games: A Review of College Drinking Games Research From 2004 to 2013

    Drinking games are a high-risk social drinking activity consisting of rules and guidelines that determine when and how much to drink (Polizzotto et al., 2007). Borsari's (2004) seminal review paper on drinking games in the college environment succinctly captured the published literature as of February 2004. However, research on college drinking games has grown exponentially during the last decade, necessitating an updated review of the literature. This review provides an in-depth summary and synthesis of current drinking games research (e.g., characteristics of drinking games, and behavioral, demographic, social, and psychological influences on participation) and suggests several promising areas for future drinking games research. This review is intended to foster a better understanding of drinking game behaviors among college students and improve efforts to reduce the negative impact of this practice on college campuses.

    Strategies to Maximize Science Data Availability for the GOES-R Series of Satellites

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) is the next generation of United States geostationary weather satellites. The GOES-R series significantly improves the detection and observation of environmental phenomena that directly affect public safety, protection of property, and the economic health and prosperity of the United States and all countries within the western hemisphere. Given the real-time or "now-casting" nature of the GOES science-gathering mission, any data outage or interruption can reduce warning times or scientific fidelity for critical weather data. GOES-R mission-level requirements limit key performance product outages to a total of six hours per year to maximize science data availability. Lower-level requirements allow for only 120 minutes of disruption at the interface between the spacecraft bus and the instruments. These requirements are met using design features of the satellite and ground system, in addition to operational strategies.
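    For scale, a quick back-of-envelope calculation (ours, not from the paper) translates the stated outage budgets into availability percentages:

```python
# Convert the quoted outage budgets into availability fractions.
HOURS_PER_YEAR = 365.25 * 24  # 8766

for label, outage_hours in [("mission-level budget", 6.0),
                            ("bus-to-instrument interface", 120.0 / 60.0)]:
    availability = 1.0 - outage_hours / HOURS_PER_YEAR
    print(f"{label}: {availability:.4%} available")
# mission-level budget: ~99.93% ; interface: ~99.98%
```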

    Engineering High-Fidelity Residue Separations for Selective Harvest

    Composition and pretreatment studies of corn stover and wheat stover anatomical fractions clearly show that some fractions are of higher value than others as a biofeedstock. This premise, along with soil sustainability and erosion control concerns, provides the motivation for the selective harvest concept: separating and collecting the higher-value residue fractions in a combine during grain harvest. This study treats the analysis of anatomical fractions as theoretical feedstock quality targets, but not as practical targets for developing selective harvest technologies. Rather, practical quality targets were established that identified the residue separation requirements of a selective harvest combine. Data are presented that show that a current grain combine is not capable of achieving the fidelity of residue fractionation established by the performance targets. However, using a virtual engineering approach based on an understanding of the fluid dynamics of the air-stream separation, the separation fidelity can be significantly improved without significant changes to the harvester design. A virtual engineering model of a grain combine was developed and used to perform simulations of the residue separator performance. The engineered residue separator was then built into a selective harvest test combine, and tests were performed to evaluate the separation fidelity. Field tests were run both with and without the residue separator installed in the test combine, and the chaff and straw residue streams were collected during harvest of Challis soft white spring wheat. The separation fidelity accomplished both with and without the residue separator was quantified by laboratory screening analysis. The screening results showed that the engineered baffle separator performed remarkably well, effecting high-fidelity separation of the straw and chaff residue streams, improving the chaff stream purity and increasing the straw stream yield.
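    The two fidelity measures discussed above, chaff-stream purity and straw-stream yield, can be computed directly from screened component masses. The sketch below uses invented masses purely for illustration:

```python
# Illustrative sketch (hypothetical numbers, not the paper's data) of
# the two fidelity metrics: purity of the chaff stream and yield of
# the straw stream, from laboratory screening of each residue stream.
def stream_metrics(chaff_stream_kg, straw_stream_kg):
    """Each argument: dict of component masses found by lab screening."""
    chaff_purity = chaff_stream_kg["chaff"] / sum(chaff_stream_kg.values())
    total_straw = chaff_stream_kg["straw"] + straw_stream_kg["straw"]
    straw_yield = straw_stream_kg["straw"] / total_straw
    return chaff_purity, straw_yield

purity, yield_ = stream_metrics(
    chaff_stream_kg={"chaff": 42.0, "straw": 8.0},
    straw_stream_kg={"chaff": 5.0, "straw": 45.0},
)
print(f"chaff-stream purity: {purity:.0%}, straw yield: {yield_:.0%}")
```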

    The DEEP2 Galaxy Redshift Survey: The Voronoi-Delaunay Method Catalog of Galaxy Groups

    We present a public catalog of galaxy groups constructed from the spectroscopic sample of galaxies in the fourth data release from the Deep Extragalactic Evolutionary Probe 2 (DEEP2) Galaxy Redshift Survey, including the Extended Groth Strip (EGS). The catalog contains 1165 groups with two or more members in the EGS over the redshift range 0 < z < 1.5, as well as groups at z > 0.6 in the rest of DEEP2. Twenty-five percent of EGS galaxies and fourteen percent of high-z DEEP2 galaxies are assigned to galaxy groups. The groups were detected using the Voronoi-Delaunay method (VDM) after it was optimized on mock DEEP2 catalogs, following methods similar to those employed in Gerke et al. In the optimization effort, we took particular care to ensure that the mock catalogs resemble the data as closely as possible, and we fine-tuned our methods separately on mocks constructed for the EGS and the rest of DEEP2. We also probed the effect of the assumed cosmology on our inferred group-finding efficiency by performing our optimization on three different mock catalogs with different background cosmologies, finding large differences in the group-finding success we can achieve for these different mocks. Using the mock catalog whose background cosmology is most consistent with current data, we estimate that the DEEP2 group catalog is 72% complete and 61% pure (74% and 67% for the EGS) and that the group finder correctly classifies 70% of galaxies that truly belong to groups, with an additional 46% of interloper galaxies contaminating the catalog (66% and 43% for the EGS). We also confirm that the VDM catalog reconstructs the abundance of galaxy groups with velocity dispersions above ~300 km s^-1 to an accuracy better than the sample variance, and that this successful reconstruction is not strongly dependent on cosmology. This makes the DEEP2 group catalog a promising probe of the growth of cosmic structure that can potentially be used for cosmological tests.
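    The galaxy-level statistics quoted above can be illustrated with a toy calculation (ours, not the DEEP2 pipeline) on sets of galaxy identifiers:

```python
# Toy illustration of two of the quoted statistics: the fraction of
# true group members the finder recovers, and the interloper fraction
# among galaxies the finder places in groups. Inputs are invented.
def galaxy_success_and_interlopers(true_members, found_members):
    """Both arguments are sets of galaxy ids assigned to groups."""
    recovered = len(true_members & found_members) / len(true_members)
    interloper = len(found_members - true_members) / len(found_members)
    return recovered, interloper

true_m = set(range(0, 1000))     # galaxies truly in groups
found_m = set(range(300, 1400))  # galaxies the finder puts in groups
s, f = galaxy_success_and_interlopers(true_m, found_m)
print(f"recovered {s:.0%} of true members; {f:.0%} interlopers")
```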

    L-Edge Spectroscopy of Dilute, Radiation-Sensitive Systems Using a Transition-Edge-Sensor Array

    We present X-ray absorption spectroscopy and resonant inelastic X-ray scattering (RIXS) measurements on the iron L-edge of 0.5 mM aqueous ferricyanide. These measurements demonstrate the ability of high-throughput transition-edge-sensor (TES) spectrometers to access the rich soft X-ray (100-2000 eV) spectroscopy regime for dilute and radiation-sensitive samples. Our low-concentration data are in agreement with high-concentration measurements recorded by conventional grating-based spectrometers. These results show that soft X-ray RIXS spectroscopy acquired by high-throughput TES spectrometers can be used to study the local electronic structure of dilute metal-centered complexes relevant to biology, chemistry and catalysis. In particular, TES spectrometers have a unique ability to characterize frozen solutions of radiation- and temperature-sensitive samples. Comment: 19 pages, 4 figures

    Gridded and direct Epoch of Reionisation bispectrum estimates using the Murchison Widefield Array

    We apply two methods to estimate the 21 cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly-spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 hours of high-band (167-197 MHz; z = 6.2-7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 hours, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce the foreground bias that hinders power spectrum estimators, and the 21 cm bispectrum may be accessible in less time than the 21 cm power spectrum for some wave modes, with detections in hundreds of hours. Comment: 19 pages, 10 figures, accepted for publication in PASA
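    Schematically, the direct estimator averages the triple product of visibilities over all redundant copies of a closed baseline triangle. The sketch below is our illustration; the input layout and names are assumptions, not the paper's implementation:

```python
# Schematic direct bispectrum estimator for one triangle shape: the
# triple product of visibilities on the three baselines of each closed
# triangle (b1 + b2 + b3 = 0), averaged over redundant copies.
import numpy as np

def direct_bispectrum(vis_triples):
    """vis_triples: complex array of shape (n_triangles, 3) holding
    calibrated visibilities V(b1), V(b2), V(b3) on redundant copies
    of a single closed-triangle shape."""
    triple_products = vis_triples[:, 0] * vis_triples[:, 1] * vis_triples[:, 2]
    return triple_products.mean()

rng = np.random.default_rng(0)
vis = rng.normal(size=(500, 3)) + 1j * rng.normal(size=(500, 3))
print(direct_bispectrum(vis))  # noise-like input -> value near zero
```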

    Calibration database for the Murchison Widefield Array All-Sky Virtual Observatory

    We present a calibration component for the Murchison Widefield Array All-Sky Virtual Observatory (MWA ASVO) utilising a newly developed PostgreSQL database of calibration solutions. Since its inauguration in 2013, the MWA has recorded over thirty-four petabytes of data archived at the Pawsey Supercomputing Centre. According to the MWA Data Access policy, data become publicly available eighteen months after collection; therefore, most of the archival data are now available to the public. Access to public data was provided in 2017 via the MWA ASVO interface, which allowed researchers worldwide to download uncalibrated MWA data in standard radio astronomy data formats (CASA measurement sets or UV FITS files). The addition of the MWA ASVO calibration feature opens a new, powerful avenue for researchers without detailed knowledge of the MWA telescope and data processing to download calibrated visibility data and create images using standard radio-astronomy software packages. In order to populate the database with calibration solutions from the last six years, we developed fully automated pipelines. A near-real-time pipeline has been used to process new calibration observations as soon as they are collected and to upload calibration solutions to the database, which enables monitoring of the interferometric performance of the telescope. Based on this database, we present an analysis of the stability of the MWA calibration solutions over long time intervals. Comment: 12 pages, 9 figures, accepted for publication in PASA
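    As a hypothetical sketch of how a client might query such a database for the calibration solution nearest in time to a target observation: the table and column names below are invented, since the abstract does not describe the actual schema.

```python
# Hypothetical query against a PostgreSQL calibration database.
# Table/column names and the DSN are placeholders, not the MWA schema.
import psycopg2

def nearest_calibration(conn, obs_start_gps):
    """Return the calibration solution closest in time to obs_start_gps."""
    with conn.cursor() as cur:
        cur.execute(
            """SELECT cal_id, obs_start_gps, solution_path
               FROM calibration_solutions
               ORDER BY ABS(obs_start_gps - %s)
               LIMIT 1""",
            (obs_start_gps,),
        )
        return cur.fetchone()

conn = psycopg2.connect("dbname=mwa_asvo")  # placeholder DSN
print(nearest_calibration(conn, 1126847624))
```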

    Angular Momentum and the Formation of Stars and Black Holes

    The formation of compact objects like stars and black holes is strongly constrained by the requirement that nearly all of the initial angular momentum of the diffuse material from which they form must be removed or redistributed during the formation process. The mechanisms that may be involved and their implications are discussed for (1) low-mass stars, most of which probably form in binary or multiple systems; (2) massive stars, which typically form in clusters; and (3) supermassive black holes that form in galactic nuclei. It is suggested that in all cases, gravitational interactions with other stars or mass concentrations in a forming system play an important role in redistributing angular momentum and thereby enabling the formation of a compact object. If this is true, the formation of stars and black holes must be a more complex, dynamic, and chaotic process than in standard models. The gravitational interactions that redistribute angular momentum tend to couple the mass of a forming object to the mass of the system, and this may have important implications for mass ratios in binaries, the upper stellar IMF in clusters, and the masses of supermassive black holes in galaxies. Comment: Accepted by Reports on Progress in Physics
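    The scale of this angular momentum problem can be illustrated with a rough estimate using typical literature values (ours, not from the paper): the specific angular momentum of a rotating prestellar core exceeds the breakup limit of a Sun-like star by orders of magnitude.

```python
# Order-of-magnitude estimate of the angular momentum problem.
# Core radius and rotation rate are assumed typical values.
import math

G = 6.674e-8       # gravitational constant, cgs
M_sun = 1.989e33   # g
R_sun = 6.957e10   # cm
pc = 3.086e18      # cm

# Typical dense core: R ~ 0.1 pc, Omega ~ 1e-14 s^-1 (assumed).
j_core = 1e-14 * (0.1 * pc) ** 2          # specific angular momentum
# Breakup specific angular momentum of a solar-mass, solar-radius star.
j_breakup = math.sqrt(G * M_sun * R_sun)

print(f"core j ~ {j_core:.1e}, breakup j ~ {j_breakup:.1e} cm^2 s^-1")
print(f"excess factor ~ {j_core / j_breakup:.0f}")  # hundreds
```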